A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints

Authors

  • Ion Necoara
  • Andrei Patrascu
Abstract

In this paper we propose a variant of the random coordinate descent method for solving linearly constrained convex optimization problems with composite objective functions. If the smooth part of the objective function has a Lipschitz continuous gradient, then we prove that our method obtains an ε-optimal solution in O(N/ε) iterations, where N is the number of blocks. For the class of problems with cheap coordinate derivatives we show that the new method is faster than methods based on full gradient information. An analysis of the rate of convergence in probability is also provided. For strongly convex functions our method converges linearly. Extensive numerical tests confirm that on very large problems our method is much more efficient numerically than methods based on full gradient information.
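To make the mechanism concrete, here is a minimal sketch (not the authors' implementation) of one natural instance of such a method: pairwise random coordinate descent for min f(x) + h(x) subject to a single coupling constraint sum(x) = b, with the composite term h taken as the indicator of the nonnegative orthant so that the two-coordinate subproblem has a closed-form solution. The function names, the uniform pair sampling, and the least-squares example below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def random_pair_cd(grad_f, lipschitz, x0, n_iters=10000, rng=None):
    """Sketch: random pairwise coordinate descent for
        min_x f(x) + h(x)   s.t.  sum(x) = b,
    with h the indicator of the nonnegative orthant and f having
    coordinate-wise Lipschitz-continuous partial derivatives L_i.

    grad_f(x) returns the full gradient of f; only two entries are used
    per iteration (a tuned implementation would compute just those two).
    x0 must be feasible: x0 >= 0 and sum(x0) = b.
    """
    rng = np.random.default_rng(rng)
    x = x0.astype(float)
    n = x.size
    for _ in range(n_iters):
        i, j = rng.choice(n, size=2, replace=False)  # uniformly random pair
        g = grad_f(x)
        # A direction d with d_i = delta and d_j = -delta keeps sum(x) fixed.
        slope = g[i] - g[j]                  # derivative of f along that direction
        curv = lipschitz[i] + lipschitz[j]   # curvature of the quadratic model
        delta = -slope / curv                # unconstrained minimizer of the model
        delta = np.clip(delta, -x[i], x[j])  # enforce x_i + delta >= 0, x_j - delta >= 0
        x[i] += delta
        x[j] -= delta
    return x

# Illustrative use: least squares over {x >= 0, sum(x) = 1}.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 200))
y = rng.standard_normal(50)
L = (A ** 2).sum(axis=0)        # coordinate-wise Lipschitz constants of 0.5*||Ax - y||^2
x0 = np.full(200, 1.0 / 200)    # feasible starting point
x_hat = random_pair_cd(lambda x: A.T @ (A @ x - y), L, x0, n_iters=20000)
```

Each iteration touches only two coordinates, two Lipschitz constants and (in a careful implementation) two partial derivatives, which is where the cheap per-iteration cost relative to full gradient methods comes from.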

Related articles

A Random Coordinate Descent Method on Large-scale Optimization Problems with Linear Constraints

In this paper we develop a random block coordinate descent method for minimizing large-scale convex problems with linearly coupled constraints and prove that it obtains in expectation an ε-accurate solution in at most O(1/ε) iterations. However, the numerical complexity per iteration of the new method is usually much cheaper than that of methods based on full gradient information. We focus on...

Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization

In this paper we analyze several new methods for solving nonconvex optimization problems with the objective function formed as a sum of two terms: one is nonconvex and smooth, and the other is convex but simple, with known structure. Further, we consider both cases: unconstrained and linearly constrained nonconvex problems. For optimization problems of the above structure, we propose random ...

Linear Objective Function Optimization with the Max-product Fuzzy Relation Inequality Constraints

In this paper, an optimization problem with a linear objective function subject to a consistent finite system of fuzzy relation inequalities using the max-product composition is studied. Since its feasible domain is non-convex, traditional linear programming methods cannot be applied to solve it. We study this problem and capture some special characteristics of its feasible domain and optimal s...

A Random Coordinate Descent Algorithm for Singly Linear Constrained Smooth Optimization

In this paper we develop a novel randomized block-coordinate descent method for minimizing multi-agent convex optimization problems with singly linear coupled constraints over networks and prove that it obtains in expectation an ε-accurate solution in at most O(1/(λ2(Q)ε)) iterations, where λ2(Q) is the second smallest eigenvalue of a matrix Q that is defined in terms of the probabilities and t...

Random Block Coordinate Descent Methods for Linearly Constrained Optimization over Networks

In this paper we develop random block coordinate descent methods for minimizing large-scale linearly constrained convex problems over networks. Since coupled constraints appear in the problem, we devise an algorithm that, at each iteration, updates in parallel at least two random components of the solution, chosen according to a given probability distribution. Those computations can be performed ...
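A short derivation, in illustrative notation (A, b, U_i, d_i are not taken from the abstract), shows why a feasible update must touch at least two components when a coupling constraint is present:

```latex
% With iterates required to satisfy Ax = b, a single-block update
% x^+ = x + U_i d_i stays feasible only if A_i d_i = 0, which in general
% forces d_i = 0.  Updating a pair of blocks (i, j) instead gives
\[
  A x^{+} \;=\; A x + A_i d_i + A_j d_j \;=\; b
  \quad\Longleftrightarrow\quad
  A_i d_i + A_j d_j = 0 ,
\]
% so nontrivial feasible directions exist, e.g. d_j = -d_i when A_i = A_j
% (as with the single coupling constraint \sum_k x_k = b).
```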


Journal:
  • Comp. Opt. and Appl.

Volume 57, Issue

Pages -

Publication date: 2014